The Ability of Word Embeddings to Capture Word Similarities

Authors

Abstract


Related articles

Unsupervised Word Mapping Using Structural Similarities in Monolingual Embeddings

Most existing methods for automatic bilingual dictionary induction rely on prior alignments between the source and target languages, such as parallel corpora or seed dictionaries. For many language pairs, such supervised alignments are not readily available. We propose an unsupervised approach for learning a bilingual dictionary for a pair of languages given their independently-learned monoling...
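The structural idea behind this abstract can be illustrated with a toy sketch (not the authors' actual algorithm): a word's sorted similarity profile within its own embedding space is invariant to rotations of that space, so profiles can be matched across two spaces without any seed dictionary. The 2-d vectors and the L1 matching rule below are illustrative assumptions.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def signature(i, vecs):
    """Sorted similarities of word i to every other word in its own
    space: a fingerprint that survives rotations of the space."""
    return sorted(cosine(vecs[i], v) for j, v in enumerate(vecs) if j != i)

def induce_dictionary(src, tgt):
    """Pair each source word with the target word whose structural
    fingerprint is closest (L1 distance) -- no seed dictionary used."""
    return {
        i: min(range(len(tgt)),
               key=lambda j: sum(abs(a - b)
                                 for a, b in zip(signature(i, src),
                                                 signature(j, tgt))))
        for i in range(len(src))
    }

# Toy check: the "target language" space is the source space rotated
# 90 degrees, so structural matching should recover the identity map.
src = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
tgt = [[0.0, 1.0], [-1.0, 0.0], [-0.1, 0.9]]
print(induce_dictionary(src, tgt))  # → {0: 0, 1: 1, 2: 2}
```

Real systems match fingerprints at vocabulary scale and refine the mapping iteratively; the greedy nearest-fingerprint rule here is only the smallest demonstration of the rotation-invariance argument.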


Supervised Word Sense Disambiguation with Sentences Similarities from Context Word Embeddings

In this paper, we propose a method that employs sentence similarities derived from context word embeddings for supervised word sense disambiguation. In particular, if N example sentences exist in the training data, an N-dimensional vector holding the similarities between each pair of example sentences is added to a basic feature vector. This new feature vector is then used to train a classifier for sense identification. ...
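The feature-augmentation step described above can be sketched in a few lines, assuming sentence vectors are already available (e.g. averaged context word embeddings). The function names and the 2-d toy vectors are illustrative, not from the paper.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def augment_features(base_features, sentence_vec, example_vecs):
    """Append the N similarities between the input sentence and each of
    the N training example sentences to the basic feature vector."""
    sims = [cosine(sentence_vec, ev) for ev in example_vecs]
    return base_features + sims

# Toy sentence embeddings for N = 3 training examples.
examples = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
features = augment_features([0.5, 0.2], [1.0, 0.0], examples)
print(len(features))  # → 5 (2 basic features + 3 similarities)
```

The augmented vector can then be fed to any off-the-shelf classifier; the sketch deliberately leaves the classifier choice open, as the abstract does not fix one.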


Material to "Dynamic Word Embeddings"

- L = 10: vocabulary size
- L′ = 10³: batch size for smoothing
- d = 100: embedding dimension for SoU and Twitter
- d = 200: embedding dimension for Google Books
- Ntr = 5000: number of training steps for each t (filtering)
- N′tr = 5000: number of pretraining steps with minibatch sampling (smoothing; see Algorithm 2)
- Ntr = 1000: number of training steps without minibatch sampling (smoothing; see Algorithm 2)
- cmax = 4: cont...


Word Embeddings with Multiple Word Prototypes

The ability to accurately represent word vectors that capture syntactic and semantic similarity is central to natural language processing. Thus, there is rising interest in vector-space word embeddings and their use, especially given recent methods for their fast estimation at very large scale. However, almost all recent work assumes a single representation for each word type, completely ignoring p...
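One common way to obtain multiple prototypes per word, which this sketch assumes rather than takes from the paper, is to cluster the context vectors of a word's occurrences and keep one embedding per cluster. The tiny k-means below (naive initialization, fixed iteration count) is only a minimal illustration of that idea.

```python
def kmeans(points, k, iters=20):
    """Tiny k-means: cluster occurrence context vectors so that each
    cluster centroid serves as one sense prototype for the word."""
    centers = points[:k]  # naive init: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared L2).
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        centers = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers

# Toy context vectors for occurrences of an ambiguous word like "bank":
# two occurrences near one sense, two near the other.
contexts = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
prototypes = kmeans(contexts, k=2)
print(len(prototypes))  # → 2: one embedding per induced sense
```

At lookup time, an occurrence would be represented by the prototype nearest to its current context vector, instead of by a single type-level embedding.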


Dependency-Based Word Embeddings

While continuous word embeddings are gaining popularity, current models are based solely on linear contexts. In this work, we generalize the skip-gram model with negative sampling introduced by Mikolov et al. to include arbitrary contexts. In particular, we perform experiments with dependency-based contexts, and show that they produce markedly different embeddings. The dependency-based embedding...
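The context generalization can be sketched by showing how (word, context) training pairs are read off a dependency parse instead of a linear window: each word pairs with its label-marked head, and the head pairs with an inverse-labeled context. The hand-written toy parse and the `label-1` marking convention below are illustrative assumptions, not the authors' exact scheme.

```python
def dependency_contexts(tokens, heads, labels):
    """Generate (word, context) skip-gram training pairs from a
    dependency parse: each word pairs with its head (marked with the
    dependency label), and the head with the dependent (inverse label)."""
    pairs = []
    for word, head, label in zip(tokens, heads, labels):
        if head is not None:
            pairs.append((word, f"{tokens[head]}/{label}"))
            pairs.append((tokens[head], f"{word}/{label}-1"))
    return pairs

# Toy parse of "scientist discovers star"; heads are token indices.
tokens = ["scientist", "discovers", "star"]
heads  = [1, None, 1]          # "discovers" is the root
labels = ["nsubj", "root", "dobj"]
for pair in dependency_contexts(tokens, heads, labels):
    print(pair)
```

Feeding such pairs to skip-gram with negative sampling in place of window-based pairs is what yields the "markedly different" embeddings the abstract describes: contexts become syntactic relations rather than nearby positions.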



Journal

Journal title: International Journal on Natural Language Computing

Year: 2020

ISSN: 2319-4111

DOI: 10.5121/ijnlc.2020.9302